Group Sparse Coding with a Laplacian Scale Mixture Prior

Authors

  • Pierre Garrigues
  • Bruno A. Olshausen
Abstract

We propose a class of sparse coding models that utilizes a Laplacian Scale Mixture (LSM) prior to model dependencies among coefficients. Each coefficient is modeled as a Laplacian distribution with a variable scale parameter, with a Gamma distribution prior over the scale parameter. We show that, due to the conjugacy of the Gamma prior, it is possible to derive efficient inference procedures for both the coefficients and the scale parameter. When the scale parameters of a group of coefficients are combined into a single variable, it is possible to describe the dependencies that occur due to common amplitude fluctuations among coefficients, which have been shown to constitute a large fraction of the redundancy in natural images [1]. We show that, as a consequence of this group sparse coding, the resulting inference of the coefficients follows a divisive normalization rule, and that this may be efficiently implemented in a network architecture similar to that which has been proposed to occur in primary visual cortex. We also demonstrate improvements in image coding and compressive sensing recovery using the LSM model.
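As a brief sketch of the model described in the abstract (the notation below is ours, and the exact parameterization in the paper may differ): each coefficient a_i follows a Laplacian distribution with inverse scale \lambda_i,

\[ p(a_i \mid \lambda_i) = \frac{\lambda_i}{2} \exp(-\lambda_i |a_i|), \qquad \lambda_i \sim \mathrm{Gamma}(\alpha, \beta). \]

Because the Gamma prior is conjugate to the Laplacian rate, the scale can be integrated out analytically, giving the heavy-tailed marginal

\[ p(a_i) = \int_0^\infty p(a_i \mid \lambda_i)\, p(\lambda_i)\, d\lambda_i = \frac{\alpha\, \beta^{\alpha}}{2\,(\beta + |a_i|)^{\alpha+1}}, \]

and when a group of N coefficients shares a single scale variable \lambda, its posterior is again a Gamma distribution,

\[ \lambda \mid a_1, \dots, a_N \sim \mathrm{Gamma}\!\Big(\alpha + N,\; \beta + \sum_{i=1}^{N} |a_i|\Big), \]

whose mean (\alpha + N) / (\beta + \sum_i |a_i|) divides by the summed magnitudes of the group, which is the divisive-normalization-style update referred to in the abstract.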


Related articles

Group Sparse Hidden Markov Models for Speech Recognition

This paper presents group sparse hidden Markov models (GS-HMMs), in which a sequence of acoustic features is driven by a Markov chain and each feature vector is represented by two groups of basis vectors. The group of common bases represents the features shared across states within an HMM, while the group of individual bases compensates for the intra-state residual information. Importantly, the sparse prior for se...


A Mixture Model for Learning Sparse Representations

In a latent variable model, an overcomplete representation is one in which the number of latent variables is at least as large as the dimension of the data observations. Overcomplete representations have been advocated due to robustness in the presence of noise, the ability to be sparse, and an inherent flexibility in modeling the structure of data [9]. In this report, we modify factor analysis...


Bayesian Group Sparse Learning for Nonnegative Matrix Factorization

Nonnegative matrix factorization (NMF) is developed for parts-based representation of nonnegative data with the sparseness constraint. The degree of sparseness plays an important role for model regularization. This paper presents Bayesian group sparse learning for NMF and applies it for single-channel source separation. This method establishes the common bases and individual bases to characteri...


Low-rank decomposition and Laplacian group sparse coding for image classification

This paper presents a novel image classification framework (referred to as LR-LGSC) by leveraging the low-rank matrix decomposition and Laplacian group sparse coding. First, motivated by the observation that local features (such as SIFT) extracted from neighboring patches in an image usually contain correlated (or common) items and specific (or noisy) items, we construct a structured dictionary...


Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares problem (S-NNLS). We introduce a family of scale mixtures referred to as the Rectified Gaussian Scale Mixture (RGSM) to model the sparsity-enforcing prior distribution for the signal of interest. Through a proper choice of the mixing density, the RGSM prior encompasses a wide variety o...




Publication date: 2010